IRWIN AND JOAN JACOBS CENTER FOR COMMUNICATION AND INFORMATION TECHNOLOGIES

An Information-Theoretic Perspective of the Poisson Approximation via the Chen-Stein Method

Author

  • Igal Sason
Abstract

The first part of this work considers the entropy of a sum of (possibly dependent and non-identically distributed) Bernoulli random variables. Upper bounds on the error incurred by approximating this entropy by the entropy of a Poisson random variable with the same mean are derived via the Chen-Stein method. The second part derives new lower bounds on the total variation distance and the relative entropy between the distribution of a sum of independent Bernoulli random variables and the Poisson distribution. The starting point of the derivation is a new lower bound on the total variation distance, obtained by generalizing and refining the analysis of Barbour and Hall (1984), which is based on the Chen-Stein method for the Poisson approximation. A new lower bound on the relative entropy between these two distributions is then introduced and compared to a previously reported upper bound on the relative entropy by Kontoyiannis et al. (2005). The new lower bound on the relative entropy follows from the new lower bound on the total variation distance, combined with a distribution-dependent refinement of Pinsker's inequality by Ordentlich and Weinberger (2005). Upper and lower bounds on the Bhattacharyya parameter, the Chernoff information, and the Hellinger distance between the distribution of a sum of independent Bernoulli random variables and the Poisson distribution with the same mean are derived as well, via relations between these quantities and the total variation distance and relative entropy. The analysis combines elements of information theory with the Chen-Stein method for the Poisson approximation. The resulting bounds are easy to compute, and their applicability is exemplified.
Index Terms: Chen-Stein method, Chernoff information, entropy, error bounds, error exponents, Poisson approximation, relative entropy, total variation distance.

AMS 2000 Subject Classification: Primary 60E07, 60E15, 60G50, 94A17.
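As a numerical sketch (not taken from the paper itself), the total variation distance between the distribution of a sum of independent Bernoulli random variables and the Poisson distribution with the same mean can be computed exactly for small examples and checked against the classical Barbour-Hall (1984) bounds that this work refines: with λ = Σ pᵢ, (1/32) min(1, 1/λ) Σ pᵢ² ≤ d_TV ≤ (1 − e^{−λ})/λ · Σ pᵢ². The probabilities `ps` below are an arbitrary illustrative choice.

```python
# Illustrative check of the Barbour-Hall total variation bounds for the
# Poisson approximation of a sum of independent Bernoulli(p_i) variables.
import math

def poisson_binomial_pmf(ps):
    """PMF of S = sum of independent Bernoulli(p_i), via dynamic programming."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1.0 - p)      # this Bernoulli contributes 0
            new[k + 1] += q * p          # this Bernoulli contributes 1
        pmf = new
    return pmf

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam**k / math.factorial(k)

def tv_distance(ps, tail=60):
    """d_TV between P_S and Poisson(lambda), lambda = sum(p_i); Poisson tail truncated."""
    lam = sum(ps)
    pb = poisson_binomial_pmf(ps)
    total = 0.0
    for k in range(max(len(pb), tail)):
        p_s = pb[k] if k < len(pb) else 0.0
        total += abs(p_s - poisson_pmf(lam, k))
    return 0.5 * total

ps = [0.1, 0.2, 0.05, 0.15, 0.1]          # arbitrary example probabilities
lam = sum(ps)
s2 = sum(p * p for p in ps)
d = tv_distance(ps)
lower = (1.0 / 32.0) * min(1.0, 1.0 / lam) * s2   # Barbour-Hall lower bound
upper = (1.0 - math.exp(-lam)) / lam * s2          # Barbour-Hall upper bound
print(f"d_TV = {d:.5f}, lower bound = {lower:.5f}, upper bound = {upper:.5f}")
```

The exact distance lands between the two bounds, and the gap between the lower bound and the exact value is what the improved lower bounds of this work narrow.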

Similar articles

Improved Lower Bounds on the Total Variation Distance for the Poisson Approximation

New lower bounds on the total variation distance between the distribution of a sum of independent Bernoulli random variables and the Poisson random variable (with the same mean) are derived via the Chen-Stein method. The new bounds rely on a non-trivial modification of the analysis by Barbour and Hall (1984) which surprisingly gives a significant improvement. A use of the new lower bounds is ad...


Normal Approximations of Geodesics on Smooth Triangulated Surfaces

We introduce a novel method for approximating shortest geodesics on smooth triangulated surfaces. The method relies on the theory of normal curves on triangulated surfaces and their relations with geodesics. We also discuss normal surfaces and comment on a possible extension of the method to finding minimal surfaces inside 3-manifolds.



Journal:

Volume   Issue

Pages  -

Publication date: 2012